Similar Resources
Extrapolation of Visual Motion for Manual Interception
A frequent goal of hand movement is to touch a moving target, or to make contact with a stationary object which is in motion relative to the moving head and body. This process requires a prediction of the target's motion, since the initial direction of the hand movement anticipates target motion. This experiment was designed to define the visual motion parameters that are incorporated in this p...
Full text: Interpolation and extrapolation of motion capture data
The author proposes a computer-graphics animation tool through which the relationship between human motions and subjective impressions can be quantitatively examined. In the system, three-dimensional human motion was measured with a wireless magnetic motion-capture system together with a pair of data gloves. The time-series changes of each joint angle were estimated from the measured data, using a...
Full text: Extrapolation of visual motion for manual interception
A frequent goal of hand movement is to touch a moving target or to make contact with a stationary object that is in motion relative to the moving head and body. This process requires a prediction of the target's motion, since the initial direction of the hand movement anticipates target motion. This experiment was designed to define the visual motion parameters that are incorporated in this pre...
Full text: Motion extrapolation of auditory-visual targets
DOI: 10.1016/j.inffus.2009.04.005
Many tasks involve the precise estimation of the speed and position of moving objects, for instance to catch or avoid objects that cohabit our environment. Many of these objects are characterised by signal represen...
Journal
Journal title: Journal of Vision
Year: 2019
ISSN: 1534-7362
DOI: 10.1167/19.10.194a